PyCM Report

Dataset Type : Multi-Class Classification

Note 1 : Recommended statistics for this type of classification are highlighted in aqua

Note 2 : The recommender system assumes that the input is the result of classification over the whole data rather than just a part of it. If the confusion matrix is the result of test data classification, the recommendation is not valid.

Confusion Matrix (Normalized):

Predict    L1         L2         L3
Actual
L1         0.6        0.0        0.4
L2         0.0        0.5        0.5
L3         0.0        0.4        0.6
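The normalized rows above can be reproduced from the raw counts, which are recoverable from the TP/FP/FN entries in the class statistics: each row is divided by its row sum (the class support). A minimal plain-Python sketch; the `counts` dictionary is inferred from this report, not read from PyCM:

```python
# Raw confusion matrix counts, inferred from the report's TP/FP/FN values
# (rows = actual class, columns = predicted class).
counts = {
    "L1": {"L1": 3, "L2": 0, "L3": 2},
    "L2": {"L1": 0, "L2": 1, "L3": 1},
    "L3": {"L1": 0, "L2": 2, "L3": 3},
}

def normalize_rows(matrix):
    """Divide each row by its sum, as PyCM does for the normalized matrix."""
    normed = {}
    for actual, row in matrix.items():
        total = sum(row.values())
        normed[actual] = {pred: round(v / total, 5) for pred, v in row.items()}
    return normed

print(normalize_rows(counts))
```

Row L1, for example, has support 5 (TP 3 + FN 2), so 3/5 = 0.6 and 2/5 = 0.4, matching the table.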

Overall Statistics :

ACC Macro                  0.72222
F1 Macro                   0.56515
FPR Macro                  0.20952
Kappa                      0.35484
NPV Macro                  0.77778
Overall ACC                0.58333
PPV Macro                  0.61111
SOA1 (Landis & Koch)       Fair
TPR Macro                  0.56667
Zero-one Loss              5
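Several of the overall statistics follow directly from the count matrix: Overall ACC is the diagonal sum over the population, zero-one loss is the number of misclassified samples, and Cohen's kappa compares observed agreement against chance agreement from the marginals. A sketch using the same count matrix inferred from this report:

```python
# Count matrix inferred from the report (rows = actual, columns = predicted).
counts = {
    "L1": {"L1": 3, "L2": 0, "L3": 2},
    "L2": {"L1": 0, "L2": 1, "L3": 1},
    "L3": {"L1": 0, "L2": 2, "L3": 3},
}
labels = ["L1", "L2", "L3"]

pop = sum(sum(row.values()) for row in counts.values())   # population = 12
correct = sum(counts[c][c] for c in labels)               # diagonal sum = 7

overall_acc = correct / pop           # Overall ACC = 7/12
zero_one_loss = pop - correct         # misclassified samples

# Cohen's kappa: (observed agreement - chance agreement) / (1 - chance agreement)
p_actual = {c: sum(counts[c].values()) / pop for c in labels}            # row marginals
p_pred = {c: sum(counts[r][c] for r in labels) / pop for c in labels}    # column marginals
p_chance = sum(p_actual[c] * p_pred[c] for c in labels)
kappa = (overall_acc - p_chance) / (1 - p_chance)

print(round(overall_acc, 5), zero_one_loss, round(kappa, 5))
```

The resulting kappa of about 0.355 falls in the 0.21-0.40 band that Landis & Koch label "Fair", matching the SOA1 line above.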

Class Statistics :

Class    L1           L2           L3           Description
ACC      0.83333      0.75         0.58333      Accuracy
AUC      0.8          0.65         0.58571      Area under the ROC curve
AUCI     Very Good    Fair         Poor         AUC value interpretation
F1       0.75         0.4          0.54545      F1 score - harmonic mean of precision and sensitivity
TPR      0.6          0.5          0.6          Sensitivity, recall, hit rate, or true positive rate
FPR      0.0          0.2          0.42857      Fall-out or false positive rate
PPV      1.0          0.33333      0.5          Precision or positive predictive value
TP       3            1            3            True positive/hit
FP       0            2            3            False positive/type 1 error/false alarm
FN       2            1            2            False negative/miss/type 2 error
TN       7            8            4            True negative/correct rejection
N        7            10           7            Condition negative
P        5            2            5            Condition positive or support
POP      12           12           12           Population
TOP      3            3            6            Test outcome positive
TON      9            9            6            Test outcome negative
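Each class column is a one-vs-rest view of the matrix: TP is the diagonal cell, FN the rest of the row, FP the rest of the column, and TN everything else; the rates follow as TPR = TP/(TP+FN), FPR = FP/(FP+TN), PPV = TP/(TP+FP). A sketch reproducing the table from the inferred counts:

```python
# Count matrix inferred from the report (rows = actual, columns = predicted).
counts = {
    "L1": {"L1": 3, "L2": 0, "L3": 2},
    "L2": {"L1": 0, "L2": 1, "L3": 1},
    "L3": {"L1": 0, "L2": 2, "L3": 3},
}
labels = ["L1", "L2", "L3"]
pop = sum(sum(row.values()) for row in counts.values())  # population = 12

def class_stats(cls):
    """One-vs-rest statistics for a single class."""
    tp = counts[cls][cls]
    fn = sum(counts[cls].values()) - tp              # actual cls, predicted other
    fp = sum(counts[r][cls] for r in labels) - tp    # predicted cls, actual other
    tn = pop - tp - fn - fp                          # everything else
    tpr = tp / (tp + fn)                             # sensitivity / recall
    fpr = fp / (fp + tn)                             # fall-out
    ppv = tp / (tp + fp)                             # precision
    f1 = 2 * ppv * tpr / (ppv + tpr)                 # harmonic mean of PPV and TPR
    acc = (tp + tn) / pop
    return {"TP": tp, "FP": fp, "FN": fn, "TN": tn,
            "TPR": round(tpr, 5), "FPR": round(fpr, 5),
            "PPV": round(ppv, 5), "F1": round(f1, 5), "ACC": round(acc, 5)}

for cls in labels:
    print(cls, class_stats(cls))
```

For L3, for instance, TP 3, FP 3, and TN 4 give FPR = 3/7 = 0.42857 and ACC = 7/12 = 0.58333, matching the table.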

Generated By PyCM Version 4.0